
    FocusStack and StimServer: a new open source MATLAB toolchain for visual stimulation and analysis of two-photon calcium neuronal imaging data

    Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single-cell resolution, with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing costs for the computing hardware required for in-memory analysis. Here we present a MATLAB toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with a minimal memory footprint. We also present a MATLAB toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can easily be incorporated. Both packages are available as publicly accessible source-code repositories.
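
    As a minimal illustration of the kind of stimulus-triggered analysis the toolbox automates, the sketch below computes ΔF/F traces and a peri-stimulus average in NumPy. This is not the FocusStack API (which is MATLAB); the function names and data are illustrative only.

```python
# Illustrative sketch, not the FocusStack MATLAB API: the kind of stimulus-triggered
# analysis the toolbox provides, written here in NumPy for clarity.
import numpy as np

def delta_f_over_f(traces, baseline_frames):
    """traces: (n_cells, n_frames) raw fluorescence; returns dF/F per cell."""
    f0 = traces[:, baseline_frames].mean(axis=1, keepdims=True)
    return (traces - f0) / f0

def peristimulus_average(dff, stim_onsets, window):
    """Average dF/F over a fixed window following each stimulus onset."""
    segments = [dff[:, t:t + window] for t in stim_onsets
                if t + window <= dff.shape[1]]
    return np.mean(segments, axis=0)  # shape (n_cells, window)

# Toy data: 50 cells, 2000 frames, a stimulus every 200 frames.
rng = np.random.default_rng(0)
traces = 100 + rng.normal(0, 2, (50, 2000))
dff = delta_f_over_f(traces, baseline_frames=np.arange(100))
psth = peristimulus_average(dff, stim_onsets=range(200, 2000, 200), window=50)
```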

    Eigenspectrum bounds for semirandom matrices with modular and spatial structure for neural networks

    The eigenvalue spectrum of the matrix of directed weights defining a neural network model is informative of several stability and dynamical properties of network activity. Existing results for the eigenspectra of sparse asymmetric random matrices neglect spatial or other constraints on the entries of these matrices, and so are only partially applicable to cortical-like architectures. Here we examine a parameterized class of networks defined by sparse connectivity, with connection weighting modulated by physical proximity (i.e., asymmetric Euclidean random matrices), modular network partitioning, and functional specificity within the excitatory population. We present a set of analytical constraints that apply to the eigenvalue spectra of the associated weight matrices, highlighting the relationship between connectivity rules and classes of network dynamics.
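
    A simplified numerical sketch of this setting (not the paper's exact parameterization): build a sparse, distance-modulated, asymmetric weight matrix with separate excitatory and inhibitory populations, and inspect its eigenvalue spectrum directly.

```python
# Hypothetical construction for illustration: sparse connectivity whose probability decays
# with Euclidean distance, sign-consistent columns (Dale's law), and a numerical eigenspectrum.
import numpy as np

rng = np.random.default_rng(1)
n = 400
pos = rng.uniform(0, 1, (n, 2))                         # neuron positions on a unit sheet
dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)

p_conn = 0.1 * np.exp(-dist / 0.2)                      # connection probability falls with distance
mask = rng.uniform(size=(n, n)) < p_conn
np.fill_diagonal(mask, False)

signs = np.where(rng.uniform(size=n) < 0.8, 1.0, -1.0)  # 80% excitatory, 20% inhibitory neurons
# Column j holds the outgoing weights of neuron j, so each column shares one sign.
W = mask * signs[None, :] * rng.exponential(1.0 / np.sqrt(n), size=(n, n))

eigvals = np.linalg.eigvals(W)
print("spectral radius ~", np.abs(eigvals).max())
```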

    From Neural Arbors to Daisies

    Pyramidal neurons in layers 2 and 3 of the neocortex collectively form a horizontal lattice of long-range, periodic axonal projections, known as the superficial patch system. The precise pattern of projections varies between cortical areas, but the patch system has nevertheless been observed in every area of cortex in which it has been sought, in many higher mammals. Although the clustered axonal arbors of single pyramidal cells have been examined in detail, the precise rules by which these neurons collectively merge their arbors remain unknown. To discover these rules, we generated models of clustered axonal arbors following simple geometric patterns. We found that models assuming spatially aligned but independent formation of each axonal arbor do not produce patchy labeling patterns for large simulated injections into populations of generated axonal arbors. In contrast, a model that used information distributed across the cortical sheet to generate axonal projections reproduced every observed quality of cortical labeling patterns. We conclude that the patch system cannot be built during development using only information intrinsic to single neurons. Information shared across the population of patch-projecting neurons is required for the patch system to reach its adult state.

    Network insensitivity to parameter noise via adversarial regularization

    Neuromorphic neural network processors, in the form of compute-in-memory crossbar arrays of memristors or of subthreshold analog and mixed-signal ASICs, promise enormous advantages in compute density and energy efficiency for NN-based ML tasks. However, these technologies are prone to computational non-idealities, due to process variation and intrinsic device physics. This degrades the task performance of networks deployed to the processor, by introducing parameter noise into the deployed model. While it is possible to calibrate each device, or to train networks individually for each processor, these approaches are expensive and impractical for commercial deployment. Alternative methods are therefore needed to train networks that are inherently robust against parameter variation, as a consequence of network architecture and parameters. We present a new adversarial network optimisation algorithm that attacks network parameters during training, promoting robust performance during inference in the face of parameter variation. Our approach introduces a regularization term penalising the susceptibility of a network to weight perturbation. We compare against previous approaches for producing parameter insensitivity, such as dropout, weight smoothing and introducing parameter noise during training. We show that our approach produces models that are more robust to targeted parameter variation, and equally robust to random parameter variation. Our approach finds minima in flatter locations of the weight-loss landscape than other approaches, highlighting that the networks found by our technique are less sensitive to parameter perturbation. Our work provides an approach for deploying neural network architectures to inference devices that suffer from computational non-idealities, with minimal loss of performance.
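
    The sketch below illustrates the general idea in PyTorch, using a first-order flatness penalty (the squared gradient of the loss with respect to the weights) as a simple stand-in for the paper's adversarial parameter attack; the model, data and weighting factor are illustrative only.

```python
# Hedged sketch, not the paper's algorithm: penalise the sensitivity of the loss to the
# weights, encouraging minima in flat regions of the weight-loss landscape.
import torch
import torch.nn as nn

def flatness_penalty(model, loss_fn, x, y):
    """Squared norm of d(loss)/d(weights); small values indicate a flat minimum."""
    loss = loss_fn(model(x), y)
    params = [p for p in model.parameters() if p.requires_grad]
    grads = torch.autograd.grad(loss, params, create_graph=True)  # keep graph so the penalty is trainable
    return sum((g ** 2).sum() for g in grads)

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
criterion = nn.CrossEntropyLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x, y = torch.randn(64, 16), torch.randint(0, 4, (64,))
loss = criterion(model(x), y) + 1e-3 * flatness_penalty(model, criterion, x, y)
loss.backward()
opt.step()
```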

    Developmental Origin of Patchy Axonal Connectivity in the Neocortex: A Computational Model

    Injections of neural tracers into many mammalian neocortical areas reveal a common patchy motif of clustered axonal projections. We studied in simulation a mathematical model of neuronal development in order to investigate how this patchy connectivity could arise in layer II/III of the neocortex. In our model, individual neurons of this layer expressed the activator-inhibitor components of a Gierer-Meinhardt reaction-diffusion system. The resultant steady-state reaction-diffusion pattern across the neuronal population was approximately hexagonal. Growth cones at the tips of extending axons used the various morphogens secreted by intra-patch neurons as guidance cues to direct their growth and invoke axonal arborization, yielding a patchy distribution of arborization across the entire layer II/III. We found that adjustment of a single parameter yields the intriguing linear relationship between average patch diameter and inter-patch spacing that has been observed experimentally over many cortical areas and species. We conclude that a simple Gierer-Meinhardt system expressed by the neurons of the developing neocortex is sufficient to explain the patterns of clustered connectivity observed experimentally.
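
    For reference, a standard form of the Gierer-Meinhardt activator-inhibitor equations (the paper's exact parameterization may differ), where a is the activator, h the inhibitor, and pattern formation requires the inhibitor to diffuse much faster than the activator (D_h >> D_a):

```latex
\frac{\partial a}{\partial t} = \rho \frac{a^{2}}{h} - \mu_{a}\, a + D_{a} \nabla^{2} a,
\qquad
\frac{\partial h}{\partial t} = \rho\, a^{2} - \mu_{h}\, h + D_{h} \nabla^{2} h
```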

    Evidence that breast cancer risk at the 2q35 locus is mediated through IGFBP5 regulation.

    GWAS have identified a breast cancer susceptibility locus on 2q35. Here we report the fine mapping of this locus using data from 101,943 subjects from 50 case-control studies. We genotype 276 SNPs using the 'iCOGS' genotyping array and impute genotypes for a further 1,284 using 1000 Genomes Project data. All but two SNPs, which are strongly correlated with each other (rs4442975 G/T and rs6721996 G/A), are excluded as candidate causal variants at odds against >100:1. The best functional candidate, rs4442975, is associated with oestrogen receptor positive (ER+) disease, with an odds ratio (OR) in Europeans of 0.85 (95% confidence interval = 0.84-0.87; P = 1.7 × 10⁻⁴³) per t-allele. This SNP flanks a transcriptional enhancer that physically interacts with the promoter of IGFBP5 (encoding insulin-like growth factor-binding protein 5) and displays allele-specific gene expression, FOXA1 binding and chromatin looping. Evidence suggests that the g-allele confers increased breast cancer susceptibility through relative downregulation of IGFBP5, a gene with known roles in breast cell biology.

    TRY plant trait database – enhanced coverage and open access

    Plant traits - the morphological, anatomical, physiological, biochemical and phenological characteristics of plants - determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research, spanning from evolutionary biology, community and functional ecology, to biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait-based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. The best species coverage is achieved for categorical traits, with almost complete coverage for 'plant growth form'. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait-environment relationships. These traits have to be measured on individual plants in their respective environments. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many aspects. We therefore conclude that reducing data gaps and biases in the TRY database remains a key challenge, and requires a coordinated approach to data mobilization and trait measurements. This can only be achieved in collaboration with other initiatives.

    Anatomical constraints on lateral competition in columnar cortical architectures

    Competition is a well-studied and powerful mechanism for information processing in neuronal networks, providing noise rejection, signal restoration, decision making and associative memory properties, with relatively simple requirements for network architecture. Models based on competitive interactions have been used to describe the shaping of functional properties in visual cortex, as well as the development of functional maps in columnar cortex. These models require competition within a cortical area to occur on a wider spatial scale than cooperation, usually implemented by lateral inhibitory connections with a longer range than local excitatory connections. However, measurements of cortical anatomy reveal that the spatial extent of inhibition is in fact more restricted than that of excitation. Relatively few models reflect this, and it is unknown whether lateral competition can occur in cortical-like networks that have a realistic spatial relationship between excitation and inhibition. Here we analyze simple models of cortical columns and perform simulations of larger models to show how the spatial scales of excitation and inhibition can interact to produce competition through disynaptic inhibition. Our findings give strong support to the direct coupling effect: the presence of competition across the cortical surface is predicted well by the anatomy of direct excitatory and inhibitory coupling, and multisynaptic network effects are negligible. This implies that for networks with short-range inhibition and longer-range excitation, the spatial extent of competition is even narrower than the range of inhibitory connections. Our results suggest the presence of network mechanisms that focus on intra- rather than inter-column competition in neocortex, highlighting the need for both new models and direct experimental characterizations of lateral inhibition and competition in columnar cortex.
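
    A toy 1-D sketch of the direct-coupling reasoning (hypothetical Gaussian kernels, not the paper's measured anatomy): approximate the net lateral interaction as direct excitation minus the disynaptic excitatory-to-inhibitory-to-excitatory pathway, and read off where that net kernel becomes negative, i.e. where competition is predicted.

```python
# Illustrative kernels only: net lateral interaction = direct excitation minus the
# disynaptic (E-to-I-to-E) inhibitory pathway, with inhibition shorter-range than excitation.
import numpy as np

x = np.linspace(-2.0, 2.0, 401)                         # cortical distance (arbitrary units)
gauss = lambda s: np.exp(-x**2 / (2 * s**2))

w_ee = 1.0 * gauss(0.6)                                 # longer-range direct excitation
w_ei = 1.5 * gauss(0.3)                                 # short-range excitation onto inhibitory cells
w_ie = 1.5 * gauss(0.3)                                 # short-range inhibition back onto excitatory cells

dx = x[1] - x[0]
disynaptic = np.convolve(w_ei, w_ie, mode="same") * dx  # effective E-to-I-to-E inhibition
net = w_ee - disynaptic                                 # net lateral interaction kernel

negative = x[net < 0]                                   # distances with net inhibition (competition)
print("competition predicted over distances:",
      (negative.min(), negative.max()) if negative.size else "none")
```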

    Supervised training of spiking neural networks for robust deployment on mixed-signal neuromorphic processors

    Mixed-signal analog/digital circuits emulate spiking neurons and synapses with extremely high energy efficiency, an approach known as "neuromorphic engineering". However, analog circuits are sensitive to process-induced variation among transistors in a chip ("device mismatch"). For neuromorphic implementations of Spiking Neural Networks (SNNs), mismatch causes parameter variation between identically-configured neurons and synapses. Each chip exhibits a different distribution of neural parameters, causing deployed networks to respond differently between chips. Current solutions for mitigating mismatch, based on per-chip calibration or on-chip learning, entail increased design complexity, area and cost, making deployment of neuromorphic devices expensive and difficult. Here we present a supervised learning approach that produces SNNs with high robustness to mismatch and other common sources of noise. Our method trains SNNs to perform temporal classification tasks by mimicking a pre-trained dynamical system, using a local learning rule from non-linear control theory. We demonstrate our method on two tasks requiring memory, and measure the robustness of our approach to several forms of noise and mismatch. We show that our approach is more robust than common alternatives for training SNNs. Our method provides robust deployment of pre-trained networks on mixed-signal neuromorphic hardware, without requiring per-device training or calibration.
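
    As an illustration of the evaluation idea (not the paper's code or tasks), the sketch below applies multiplicative log-normal parameter noise as a stand-in for per-chip device mismatch and measures how task accuracy varies across simulated devices; the toy linear readout is a placeholder for a deployed SNN.

```python
# Hedged sketch: simulate device mismatch as multiplicative log-normal noise on every
# parameter, then measure accuracy across many simulated "chips".
import numpy as np

def simulate_mismatch(params, sigma, rng):
    """Return a copy of the parameters with multiplicative log-normal mismatch applied."""
    return {k: v * rng.lognormal(mean=0.0, sigma=sigma, size=v.shape) for k, v in params.items()}

def accuracy(params, x, y):
    """Toy readout: a single linear layer plus argmax, standing in for an SNN."""
    logits = x @ params["w"] + params["b"]
    return float((logits.argmax(axis=1) == y).mean())

rng = np.random.default_rng(3)
params = {"w": rng.normal(0, 0.5, (16, 4)), "b": np.zeros(4)}
x, y = rng.normal(size=(256, 16)), rng.integers(0, 4, 256)

clean = accuracy(params, x, y)
noisy = [accuracy(simulate_mismatch(params, 0.2, rng), x, y) for _ in range(20)]
print(f"clean: {clean:.2f}, under mismatch: {np.mean(noisy):.2f} +/- {np.std(noisy):.2f}")
```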